158 research outputs found

    Numerical Sensitivity and Efficiency in the Treatment of Epistemic and Aleatory Uncertainty

    The treatment of both aleatory and epistemic uncertainty by recent methods often requires a high computational effort. In this abstract, we propose a numerical sampling method that lightens the computational burden of treating information represented by means of so-called fuzzy random variables.
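    As an illustration of the kind of sampling scheme such approaches rely on, the following Python sketch propagates a fuzzy random variable through a model by sampling its aleatory part and propagating alpha-cuts of its fuzzy part. The function names, the example model and the assumption that the model is monotone in the fuzzy parameter are all illustrative choices, not the method proposed in the abstract.

```python
import numpy as np

def propagate_fuzzy_random(model, sample_aleatory, alpha_cut, alphas, n_samples=1000):
    """Illustrative propagation of a fuzzy random variable through a model.

    For each aleatory sample v and each alpha level, the fuzzy parameter is
    replaced by its alpha-cut [lo, hi]; assuming the model is monotone in that
    parameter, output bounds are obtained by evaluating at the cut endpoints.
    Returns, per alpha level, arrays of lower and upper model outputs.
    """
    samples = [sample_aleatory() for _ in range(n_samples)]
    results = {}
    for alpha in alphas:
        lo, hi = alpha_cut(alpha)  # alpha-cut of the fuzzy parameter
        lower = np.array([min(model(v, lo), model(v, hi)) for v in samples])
        upper = np.array([max(model(v, lo), model(v, hi)) for v in samples])
        results[alpha] = (lower, upper)
    return results

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    out = propagate_fuzzy_random(
        model=lambda v, k: k * v,                            # hypothetical model
        sample_aleatory=lambda: rng.normal(10.0, 2.0),       # aleatory input
        alpha_cut=lambda a: (1.0 + 0.5 * a, 2.0 - 0.5 * a),  # triangular fuzzy number on [1, 2]
        alphas=[0.0, 0.5, 1.0],
    )
    for a, (lower, upper) in out.items():
        print(a, lower.mean(), upper.mean())
```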

    Computing expectations with p-boxes: two views of the same problem

    Given an imprecise probabilistic model over a continuous space, computing lower (upper) expectations is often computationally hard, even in simple cases. Building tractable methods to do so is thus a crucial point in applications. In this paper, we concentrate on p-boxes (a simple and popular model) and on lower expectations of non-monotone functions. For various particular cases, we propose tractable methods to compute approximations or exact values of these lower expectations. We find it instructive to exhibit and compare two approaches: the first uses general linear programming, while the second uses the fact that p-boxes are special cases of random sets. We underline the complementarity of the two approaches, as well as their differences.
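    For reference, the objects the abstract refers to admit standard definitions (textbook formulations, not results from the paper): a p-box is a pair of cumulative distribution functions bounding the unknown one, and the lower expectation of a function f is an infimum over the credal set the p-box induces.

```latex
% P-box: a pair of CDFs with \underline{F} \le \overline{F} bounding the unknown F.
\mathcal{M}(\underline{F},\overline{F})
  = \bigl\{\, P \;:\; \underline{F}(x) \le P\bigl((-\infty,x]\bigr) \le \overline{F}(x)
      \ \text{for all } x \,\bigr\},
\qquad
\underline{E}(f) = \inf_{P \in \mathcal{M}(\underline{F},\overline{F})} E_P[f],
\qquad
\overline{E}(f) = -\,\underline{E}(-f).
```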

    Other uncertainty theories based on capacities

    The two main uncertainty representations in the literature that tolerate imprecision are possibility distributions and random disjunctive sets. This chapter devotes special attention to the theories that have emerged from them. The first part of the chapter discusses epistemic logic and derives the need for capturing imprecision in information representations; it bridges the gap between uncertainty theories and epistemic logic by showing that imprecise probabilities subsume modalities of possibility and necessity as much as probability. The second part presents possibility and evidence theories, their origins, assumptions and semantics, and discusses the connections between them and the general framework of imprecise probability. Finally, the chapter points out the remaining discrepancies between the different theories regarding various basic notions, such as conditioning, independence or information fusion, and the existing bridges between them.
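    The set functions underlying these two representations are standard; for reference (textbook definitions, not chapter-specific results), a possibility distribution pi and a mass function m induce the following bounds on any event A, where the probability bounds hold over the credal sets they induce.

```latex
% Possibility theory: possibility and necessity measures induced by \pi.
\Pi(A) = \sup_{x \in A} \pi(x), \qquad
N(A) = 1 - \Pi(A^{c}), \qquad
N(A) \le P(A) \le \Pi(A).

% Evidence theory: belief and plausibility induced by a mass function m.
Bel(A) = \sum_{E \subseteq A} m(E), \qquad
Pl(A) = \sum_{E \cap A \neq \emptyset} m(E), \qquad
Bel(A) \le P(A) \le Pl(A).
```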

    Interval analysis on non-linear monotonic systems as an efficient tool to optimise fresh food packaging

    When few data or information are available, the validity of studies performing uncertainty analysis or robust design optimisation (i.e., parameter optimisation under uncertainty) with a probabilistic approach is questionable. This is particularly true in some agronomical fields, where parameter and variable uncertainties are often quantified by a handful of measurements or by expert opinions. In this paper, we propose a simple alternative approach based on interval analysis, which avoids the pitfalls of a classical probabilistic approach. We propose simple methods to achieve uncertainty propagation, parameter optimisation and sensitivity analysis in cases where the model satisfies some monotonicity properties. As a real-world case study, we consider the application developed in our laboratory that motivated the present work, namely the design of sustainable food packaging that preserves fresh fruits and vegetables for as long as possible.
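    A minimal Python sketch of the core trick: when the model is componentwise monotone with known directions, the output bounds over an input box are reached at two of its vertices, so interval propagation needs only two model evaluations. The function and example below are illustrative and not the paper's implementation.

```python
def interval_bounds(model, intervals, signs):
    """Output bounds of a componentwise monotone model over interval inputs.

    intervals: list of (lo, hi) bounds for each input parameter.
    signs: +1 if the model is increasing in that input, -1 if decreasing.
    Monotonicity implies the bounds are attained at two vertices of the box,
    hence only two evaluations of the model are required.
    """
    x_low = [lo if s > 0 else hi for (lo, hi), s in zip(intervals, signs)]
    x_high = [hi if s > 0 else lo for (lo, hi), s in zip(intervals, signs)]
    return model(x_low), model(x_high)

if __name__ == "__main__":
    # Hypothetical model y = a * x - b, increasing in a and x, decreasing in b.
    model = lambda p: p[0] * p[1] - p[2]
    lo, hi = interval_bounds(model, [(1, 2), (3, 4), (0, 1)], [+1, +1, -1])
    print(lo, hi)  # 2 8
```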

    Special Cases

    This chapter reviews special cases of lower previsions that are instrumental in practical applications. We emphasize their various advantages and drawbacks, as well as the kinds of problems in which they are most useful.

    How to Handle Missing Values in Multi-Criteria Decision Aiding?

    In applications of Multi-Criteria Decision Making, it is often the case that the values of alternatives are unknown on some attributes. An interesting situation arises when the attributes with missing values are actually not relevant and shall thus be removed from the model. Given a model elicited on the complete set of attributes, we thus look for a way, called a restriction operator, to automatically remove the missing attributes from this model. Axiomatic characterizations are proposed for three classes of models. For general quantitative models, the restriction operator is characterized by linearity, recursivity and decomposition on variables. The second class is the set of monotone quantitative models satisfying normalization conditions; the linearity axiom is changed to fit these conditions, and adding recursivity and symmetry, the restriction operator takes the form of a normalized average. For the last class of models, namely the Choquet integral, we obtain a simpler expression. Finally, a very intuitive interpretation is provided for this last model.
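    To give a concrete flavour of what a normalized-average restriction can look like, here is one simple illustrative form for removing a single attribute i with finite domain X_i; uniform weighting is assumed purely for illustration, and the operator actually characterized in the paper may differ.

```latex
\bar U_{-i}(x_{-i}) \;=\; \frac{1}{|X_i|} \sum_{x_i \in X_i} U(x_i, x_{-i}),
\qquad
U_{-i}(x_{-i}) \;=\;
  \frac{\bar U_{-i}(x_{-i}) - \min_{z} \bar U_{-i}(z)}
       {\max_{z} \bar U_{-i}(z) - \min_{z} \bar U_{-i}(z)}.
```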

    Idempotent conjunctive combination of belief functions: Extending the minimum rule of possibility theory.

    When conjunctively merging two belief functions concerning a single variable but coming from different sources, Dempster's rule of combination is justified only when the information sources can be considered independent. When dependencies between sources are ill-known, it is usual to require the property of idempotence for the merging of belief functions, as this property captures the possible redundancy of dependent sources. To study idempotent merging, different strategies can be followed. One strategy is to rely on idempotent rules used in either more general or more specific frameworks and to study, respectively, their particularisation or extension to belief functions. In this paper, we study the feasibility of extending the idempotent fusion rule of possibility theory (the minimum) to belief functions. We first investigate how comparisons of information content, in the form of inclusion and least commitment, can be exploited to relate idempotent merging in possibility theory to evidence theory. We reach the conclusion that, unless we accept the idea that the result of the fusion process can be a family of belief functions, such an extension is not always possible. As handling such families seems impractical, we then turn our attention to a more quantitative criterion and consider those combinations that maximise the expected cardinality of the joint belief function, among the least committed ones, taking advantage of the fact that the expected cardinality of a belief function only depends on its contour function.
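    The quantitative criterion mentioned at the end rests on a standard identity: the expected cardinality of a belief function with mass function m depends only on its contour function pl({x}), since the order of summation can be exchanged.

```latex
\mathbb{E}[|A|] \;=\; \sum_{A \subseteq X} m(A)\,|A|
\;=\; \sum_{A \subseteq X} m(A) \sum_{x \in X} \mathbf{1}_{\{x \in A\}}
\;=\; \sum_{x \in X} \sum_{A \ni x} m(A)
\;=\; \sum_{x \in X} pl(\{x\}).
```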

    Representation and combination of uncertain information: new results with applications to nuclear safety studies

    It often happens that the values of some parameters or variables of a system are imperfectly known, either because of the variability of the modelled phenomena, or because the available information is imprecise or incomplete. Classical probability theory is usually used to treat these uncertainties. However, recent years have witnessed the appearance of arguments pointing to the conclusion that classical probabilities are inadequate to handle imprecise or incomplete information. Other frameworks have thus been proposed to address this problem, the three main ones being probability sets, random sets and possibility theory. Many questions concerning uncertainty treatment within these frameworks remain open. More precisely, it is necessary to build bridges between these three frameworks to advance toward a unified handling of uncertainty. There is also a need for practical methods to treat information, as using these frameworks can be computationally costly. In this work, we propose some answers to these two needs for a set of commonly encountered problems. In particular, we focus on the problems of uncertainty representation, fusion and evaluation of multiple-source information, and independence modelling. The aim is to provide tools, both theoretical and practical, to treat uncertainty. Some of these tools are then applied to problems related to nuclear safety.

    Parameters uncertainties and error propagation in modified atmosphere packaging modelling

    Mathematical models are instrumental tools to predict gas (O2 and CO2) evolution in the headspaces of Modified Atmosphere Packaging (MAP). Such models simplify the package design steps, as they allow engineers to estimate the optimal values of packaging permeability for maintaining the quality and safety of the packed food. However, these models typically require specifying several input parameter values (such as maximal respiration rates) that are obtained from experimental data and are characterized by high uncertainties due to biological variation. Although treating and modelling this uncertainty is essential to ensure the robustness of designed MAPs, this subject has seldom been considered in the literature. In this work, we describe an optimisation system based on a MAP mathematical model that determines optimal packaging permeabilities, given certain food parameters. To integrate uncertainties in the model while keeping the optimisation computational burden relatively low, we propose an approach based on interval analysis rather than the more classical probabilistic approach. The approach has two advantages: it makes minimal unverified assumptions concerning uncertainties, and it requires only a few evaluations of the model. The results of these uncertainty studies are optimal permeability values described by fuzzy sets. The approach was applied to three case studies: chicory, mushrooms and blueberry. Sensitivity analysis on the input parameters of the MAP model was also performed, showing that parameter influences depend on the fruit or vegetable considered. A comparison of the interval analysis methodology with the probabilistic (Monte Carlo) one was then performed and discussed.
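    For context, the headspace models referred to are typically gas mass balances of the following generic form; the notation below is assumed for illustration and is not taken from the paper. V is the headspace volume, A and e the film area and thickness, P the film permeability to each gas, R the respiration rate, and m the mass of packed produce.

```latex
V \,\frac{\mathrm{d} y_{O_2}}{\mathrm{d} t}
  = \frac{P_{O_2}\, A}{e}\,\bigl(y_{O_2}^{\mathrm{ext}} - y_{O_2}\bigr)
    - R_{O_2}\bigl(y_{O_2}, y_{CO_2}\bigr)\, m,
\qquad
V \,\frac{\mathrm{d} y_{CO_2}}{\mathrm{d} t}
  = \frac{P_{CO_2}\, A}{e}\,\bigl(y_{CO_2}^{\mathrm{ext}} - y_{CO_2}\bigr)
    + R_{CO_2}\bigl(y_{O_2}, y_{CO_2}\bigr)\, m.
```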